Current Issue: July-September | Volume: 2024 | Issue Number: 3 | Articles: 5
When wireless communication networks encounter jamming attacks, spectrum resources are occupied and data communication fails. To address this issue, an anti-jamming algorithm based on distributed multi-agent reinforcement learning is proposed. Each terminal observes the spectrum state of the environment and takes it as input. The algorithm then employs Q-learning, together with primary and backup channel allocation rules, to select the communication channel. The proposed algorithm designs primary and backup channel allocation rules for sweep-jamming and smart-jamming strategies; it can predict the behavior of jammers while reducing decision conflicts among terminals. Simulation results demonstrate that, compared with existing methods, the proposed algorithm not only improves data transmission success rates across multiple scenarios but also exhibits superior operational efficiency when confronted with jamming attacks. Overall, the anti-jamming performance of the proposed algorithm outperforms the comparison methods.
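For readers unfamiliar with the approach, the sketch below illustrates the general idea of tabular Q-learning with a primary/backup channel pair. It is a minimal illustration only: the class name, the epsilon-greedy policy, and the rule of taking the second-best Q-value as the backup channel are assumptions made for exposition, not the paper's allocation rules.

```python
import random
from collections import defaultdict

class AntiJammingAgent:
    """Toy tabular Q-learning agent over channel indices (illustrative)."""
    def __init__(self, n_channels, alpha=0.1, gamma=0.9, epsilon=0.1):
        self.n = n_channels
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon
        self.q = defaultdict(lambda: [0.0] * n_channels)  # state -> Q per channel

    def select_channels(self, state):
        # Epsilon-greedy choice of primary channel; backup = next-best Q-value.
        if random.random() < self.epsilon:
            primary = random.randrange(self.n)
        else:
            primary = max(range(self.n), key=lambda c: self.q[state][c])
        backup = max((c for c in range(self.n) if c != primary),
                     key=lambda c: self.q[state][c])
        return primary, backup

    def update(self, state, channel, reward, next_state):
        # Standard one-step Q-learning temporal-difference update.
        best_next = max(self.q[next_state])
        td = reward + self.gamma * best_next - self.q[state][channel]
        self.q[state][channel] += self.alpha * td
```

Here, `state` could encode the locally observed spectrum occupancy, e.g. a tuple of jammed-channel flags, with a positive reward for a successful transmission on the chosen channel.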
Massive multiple-input-multiple-output (M-MIMO) offers remarkable advantages in terms of spectral, energy, and hardware efficiency for future wireless systems. However, its performance relies on the accuracy of the channel state information (CSI) available at the transceivers, which makes channel estimation pivotal in M-MIMO systems. Prior research has focused on evaluating channel estimation methods under the assumption of spatially uncorrelated fading channel models. In this study, we evaluate the performance of the minimum-mean-square-error (MMSE) estimator in terms of the normalized mean square error (NMSE) in the uplink of M-MIMO systems over spatially correlated Rician fading. The NMSE allows for easy comparison of different M-MIMO configurations, serving as a relative performance indicator. Moreover, it is an advantageous metric due to its normalization, scale invariance, and consistent performance indication across diverse scenarios. In the system model, we assume imperfections in channel estimation and that the random angles in the correlation model follow a Gaussian distribution. For this scenario, we derive an accurate closed-form expression for calculating the NMSE, which is validated via Monte-Carlo simulations. Our numerical results reveal that as the Rician K-factor decreases, approaching Rayleigh fading conditions, the NMSE improves. Additionally, spatial correlation and a reduction in the antenna array interelement spacing lead to a reduction in NMSE, further enhancing overall system performance.
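As a rough companion to the abstract, the following Monte-Carlo sketch estimates the NMSE of the MMSE estimator for a correlated Rician channel and checks it against the standard closed-form error covariance. The exponential correlation model, the pilot power `p`, and all other parameter values are illustrative assumptions; the paper's Gaussian angular model would yield a different correlation matrix R.

```python
import numpy as np

rng = np.random.default_rng(0)
M, K_factor, p, sigma2, trials = 64, 3.0, 1.0, 0.1, 2000

# Assumed exponential spatial correlation with coefficient r (illustrative).
r = 0.7
R = np.fromfunction(lambda i, j: r ** np.abs(i - j), (M, M))

# Rician decomposition: deterministic LoS mean + scaled NLoS covariance.
h_bar = np.sqrt(K_factor / (K_factor + 1)) * np.exp(
    1j * np.pi * np.arange(M) * np.sin(0.3))
R_nlos = R / (K_factor + 1)

# MMSE filter for y = sqrt(p) h + n, h ~ CN(h_bar, R_nlos), n ~ CN(0, sigma2 I).
A = np.sqrt(p) * R_nlos @ np.linalg.inv(p * R_nlos + sigma2 * np.eye(M))
L = np.linalg.cholesky(R_nlos)

num = den = 0.0
for _ in range(trials):
    h = h_bar + L @ (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
    n = np.sqrt(sigma2 / 2) * (rng.standard_normal(M) + 1j * rng.standard_normal(M))
    y = np.sqrt(p) * h + n
    h_hat = h_bar + A @ (y - np.sqrt(p) * h_bar)
    num += np.linalg.norm(h - h_hat) ** 2
    den += np.linalg.norm(h) ** 2
print("empirical NMSE:", num / den)

# Closed-form check: error covariance C and E||h||^2 = ||h_bar||^2 + tr(R_nlos).
C = R_nlos - p * R_nlos @ np.linalg.inv(p * R_nlos + sigma2 * np.eye(M)) @ R_nlos
print("closed-form NMSE:",
      np.real(np.trace(C)) / (np.real(np.trace(R_nlos)) + np.linalg.norm(h_bar) ** 2))
```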
A Hybrid LiFi and WiFi network (HLWNet) integrates the rapid data transmission capabilities of Light Fidelity (LiFi) with the extensive connectivity provided by Wireless Fidelity (WiFi), resulting in significant benefits for wireless data transmission in the designated area. However, decision-making during the handover process in HLWNet is complicated by the line-of-sight nature of the electromagnetic signals involved, making it more intricate than in previous heterogeneous networks. This research work addresses the problem of handover decisions in hybrid LiFi and WiFi networks and treats it as a binary classification problem. Consequently, it proposes a handover method based on a deep neural network (DNN). The comprehensive handover scheme incorporates two sets of neural networks (ANN and DNN) that use input factors such as channel quality and user mobility to enable informed handover decisions. After training with labeled datasets, the neural-network-based handover approach achieves an accuracy rate exceeding 95%. A comparative analysis against the benchmark artificial neural network (ANN) reveals that the proposed method considerably increases user throughput by approximately 18.58% to 38.5% while reducing the handover rate by approximately 55.21% to 67.15%; moreover, the proposed method demonstrates robustness against variations in user mobility and channel conditions.
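The sketch below shows what a DNN-based binary handover classifier of this kind might look like. The feature set (LiFi SNR, WiFi SNR, user speed, blockage flag), layer sizes, and training details are hypothetical; the paper's exact architecture and inputs are not reproduced here.

```python
import torch
import torch.nn as nn

class HandoverDNN(nn.Module):
    """Assumed small MLP: features in, one logit out (stay on LiFi vs. hand over)."""
    def __init__(self, n_features=4, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

model = HandoverDNN()
loss_fn = nn.BCEWithLogitsLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on a synthetic labeled batch:
# columns = [LiFi SNR, WiFi SNR, user speed, blockage flag] (assumed features).
x = torch.randn(128, 4)
y = torch.randint(0, 2, (128, 1)).float()  # 1 = hand over to WiFi
opt.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
opt.step()
```

At inference time, thresholding the sigmoid of the logit at 0.5 yields the binary handover decision described in the abstract.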
Fifth-generation (5G) technology is one of the keys to the industrial revolution known as Industry 4.0, as it provides faster connectivity and allows a greater number of devices to be connected simultaneously. In the transport sector, newly produced vehicles are equipped with various sensors and applications to help drivers perform safe maneuvers. However, moving from semiautonomous to fully autonomous vehicles, and on to cooperative systems, remains a major challenge. Many researchers have focused on artificial intelligence (AI) techniques and the ability to share information to achieve this cooperative behavior. This shared information can comprise different data obtained from sensors such as laser imaging detection and ranging (LiDAR), radar, cameras, and the global positioning system (GPS), or data related to the current speed, acceleration, or position. The combination of the different shared data depends on the approach of each navigation algorithm. This data fusion allows a better understanding of the environment but overloads the network, as the traffic generated is massive. Therefore, this paper addresses the challenge of achieving cooperation between vehicles from the point of view of network requirements and computational capacity. In addition, this study contributes to advancing theory into real-world practice by examining the performance of cooperative navigation algorithms during the migration of computational resources from onboard vehicle equipment to the cloud. In particular, it investigates the transition from a cooperative navigation algorithm based on a decentralized architecture to a semidecentralized one, as computationally demanding processes previously performed onboard are moved to the cloud. Additionally, the paper discusses the indispensable role of 5G in fulfilling the escalating demands for high throughput and low latency in these services, particularly as the number of vehicles increases. The results of the tests show that AI acting alone cannot achieve optimal performance, even using 100% of the computational capacity of the onboard equipment in the vehicle. However, a system that integrates 5G and AI-based joint decisions can achieve better performance, reduce the computational resources consumed in the vehicle, and increase the efficiency of collaborative choices by up to 83.3%.
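As a toy illustration of the onboard-versus-cloud trade-off discussed above, the sketch below checks a latency budget to decide where a navigation workload should run. All parameter names, thresholds, and numbers are illustrative assumptions, not values from the study.

```python
def choose_execution_site(cpu_load, task_cycles, onboard_hz,
                          uplink_mbps, payload_mbit, rtt_ms, deadline_ms):
    """Run onboard if it meets the deadline, else offload over 5G (toy model)."""
    # Onboard latency: cycles over the CPU capacity left after current load.
    onboard_ms = task_cycles / (onboard_hz * (1.0 - cpu_load)) * 1e3
    # Offload latency: cloud compute assumed negligible, so the cost is
    # dominated by uplink transfer time plus the network round trip.
    offload_ms = payload_mbit / uplink_mbps * 1e3 + rtt_ms
    if onboard_ms <= deadline_ms:
        return "onboard", onboard_ms
    if offload_ms <= deadline_ms:
        return "cloud", offload_ms
    return "degrade", min(onboard_ms, offload_ms)

# Example: a fusion task of 5e8 cycles on a 1 GHz CPU at 80% load, 5 Mbit of
# sensor data over a 100 Mbps 5G uplink with 10 ms RTT and a 100 ms deadline
# -> onboard would take 2500 ms, so the task is offloaded (~60 ms).
print(choose_execution_site(0.8, 5e8, 1e9, 100, 5, 10, 100))
```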
The reconfigurable holographic surface (RHS) is a new antenna that is lightweight and compact and can precisely steer many beams at once. Because it transmits rather than reflects, it differs from the reconfigurable intelligent surface (RIS), which is frequently employed as a passive relay. To leverage holographic technology and generate the necessary beams, the RHS is most likely to be integrated with the transceiver as an ultra-thin, lightweight planar antenna. This has enormous potential to satisfy the growing demands of future-generation networks. This paper is the first to study a wireless secrecy communication system in which the base station is equipped with and aided by an RHS. We propose a strategy for jointly optimizing the holographic beamforming at the RHS and the digital beamforming at the base station, with artificial noise (AN) introduced to attain the highest secrecy rate. However, because of its non-convexity and the coupling among variables, this problem is challenging to solve. An efficient algorithm based on alternating optimization is proposed that obtains a suboptimal solution. Simulation studies show that the RHS outperforms the RIS in enhancing the performance of wireless secrecy communication systems, indicating that the RHS has a wide range of potential applications in physical layer security.
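To make the alternating-optimization idea concrete, the toy sketch below alternates between a zero-forcing-style digital beamformer and a per-element grid search over real holographic amplitudes for a single eavesdropper. It is a simplified stand-in, not the paper's algorithm: artificial noise is omitted, and the channel model and update rules are assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(1)
M, sigma2 = 32, 0.1                      # RHS elements, noise power (assumed)
h_b = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)  # Bob
h_e = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)  # Eve

def secrecy_rate(a, w):
    # Effective channel through element-wise holographic amplitudes a.
    g_b = abs(np.vdot(h_b * a, w)) ** 2 / sigma2
    g_e = abs(np.vdot(h_e * a, w)) ** 2 / sigma2
    return max(np.log2(1 + g_b) - np.log2(1 + g_e), 0.0)

a = np.ones(M)                           # holographic amplitudes in [0, 1]
for _ in range(20):                      # alternating optimization
    # Step 1: digital beamformer projects Bob's effective channel away
    # from Eve's (zero-forcing style), then normalizes the power.
    f_b, f_e = h_b * a, h_e * a
    w = f_b - f_e * (np.vdot(f_e, f_b) / np.vdot(f_e, f_e))
    w /= np.linalg.norm(w)
    # Step 2: per-element search over a small amplitude grid, keeping
    # whichever value of a[i] maximizes the current secrecy rate.
    for i in range(M):
        a[i] = max(np.linspace(0, 1, 11),
                   key=lambda v: secrecy_rate(np.r_[a[:i], v, a[i + 1:]], w))
print("secrecy rate (bits/s/Hz):", secrecy_rate(a, w))  # w from last iteration
```

The alternation mirrors the structure described in the abstract: each subproblem is tractable on its own, and iterating between them converges to a suboptimal but useful operating point.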